Low-rank updates of matrix functions
Authors
Abstract
We consider the task of updating a matrix function f(A) when the matrix A ∈ C^{n×n} is subject to a low-rank modification. In other words, we aim at approximating f(A+D) − f(A) for a matrix D of rank k ≪ n. The approach proposed in this paper attains efficiency by projecting onto tensorized Krylov subspaces produced by matrix-vector multiplications with A and A∗. We prove that the approximations obtained from m steps of the proposed methods are exact if f is a polynomial of degree at most m, and we use this as a basis for proving a variety of convergence results, in particular for the matrix exponential and for Markov functions. We illustrate the performance of our method by considering various examples from network analysis, where our approach can be used to cheaply update centrality and communicability measures.
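To convey the flavor of such a projection-based update, the following Python sketch uses the identity that, for analytic f, f(A+D) − f(A) equals the (1,2) block of f applied to the block matrix [[A+D, D], [0, A]], and evaluates f = exp on compressions of these blocks onto the block Krylov subspaces K_m(A, B) and K_m(A∗, C) for D = BC∗. This is a simplified illustration under assumed test data (the matrices A, B, C, the step count m, and the helper block_krylov_basis are not from the paper), not the authors' exact algorithm.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, k, m = 300, 2, 20                      # assumed size, update rank, Krylov steps

A = -np.eye(n) + rng.standard_normal((n, n)) / (2 * np.sqrt(n))
B = rng.standard_normal((n, k)) / np.sqrt(n)
C = rng.standard_normal((n, k)) / np.sqrt(n)
D = B @ C.T                               # rank-k modification

def block_krylov_basis(M, X, steps):
    """Orthonormal basis of the block Krylov space span{X, MX, ..., M^(steps-1) X}."""
    Q, _ = np.linalg.qr(X)
    blocks = [Q]
    for _ in range(steps - 1):
        W = M @ blocks[-1]
        for Qj in blocks:                 # orthogonalize against earlier blocks
            W -= Qj @ (Qj.conj().T @ W)
        Qn, _ = np.linalg.qr(W)
        blocks.append(Qn)
    return np.hstack(blocks)

U = block_krylov_basis(A, B, m)           # basis of K_m(A, B)
V = block_krylov_basis(A.conj().T, C, m)  # basis of K_m(A*, C)

# f(A+D) - f(A) is the (1,2) block of f([[A+D, D], [0, A]]); evaluate f = exp on the
# compressions of these blocks onto range(U) and range(V) and lift the result back.
TU = U.conj().T @ (A + D) @ U
TV = V.conj().T @ A @ V
F = expm(np.block([[TU, U.conj().T @ D @ V],
                   [np.zeros((TV.shape[0], TU.shape[1])), TV]]))
update = U @ F[:TU.shape[0], TU.shape[0]:] @ V.conj().T

exact = expm(A + D) - expm(A)             # dense reference, affordable at this size
print(np.linalg.norm(update - exact) / np.linalg.norm(exact))
```

Because D = BC∗ maps every vector into the range of B, the Krylov space K_m(A+D, B) is contained in K_m(A, B), so both bases can be built using only products with A and A∗, in line with the abstract.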
Similar articles
The Radau-Lanczos Method for Matrix Functions
Analysis and development of restarted Krylov subspace methods for computing f(A)b have proliferated in recent years. We present an acceleration technique for such methods when applied to Stieltjes functions f and Hermitian positive definite matrices A. This technique is based on a rank-one modification of the Lanczos matrix derived from a connection between the Lanczos process and Gauss–Radau q...
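For context, here is a minimal sketch of the plain (unrestarted, unmodified) Lanczos approximation of f(A)b that such restarted and Radau-modified methods build on; the Stieltjes function f(z) = z^{-1/2}, the test matrix, and the step count are assumptions, not data from that paper.

```python
import numpy as np

def lanczos_fAb(A, b, m, f):
    """m-step Lanczos approximation of f(A) b for Hermitian A (no reorthogonalization)."""
    n = len(b)
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    Q[:, 0] = b / np.linalg.norm(b)
    q_prev, beta_prev = np.zeros(n), 0.0
    for j in range(m):
        w = A @ Q[:, j] - beta_prev * q_prev
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            q_prev, beta_prev = Q[:, j], beta[j]
            Q[:, j + 1] = w / beta[j]
    # f(A) b ~ ||b|| * Q f(T) e_1, with T the Lanczos tridiagonal matrix
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    return np.linalg.norm(b) * (Q @ (evecs @ (f(evals) * evecs[0, :])))

# Assumed example: SPD 1-D Laplacian and the Stieltjes function f(z) = z^{-1/2}
n = 500
A = np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
b = np.ones(n)
x40 = lanczos_fAb(A, b, 40, lambda z: 1.0 / np.sqrt(z))
```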
Face Recognition Based Rank Reduction SVD Approach
Standard face recognition algorithms that use standard feature extraction techniques always suffer from image performance degradation. Recently, singular value decomposition and low-rank matrix are applied in many applications, including pattern recognition and feature extraction. The main objective of this research is to design an efficient face recognition approach by combining many tech...
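As a generic illustration of rank reduction via the singular value decomposition (not the specific recognition pipeline of that paper; the image-sized matrix and target rank below are assumptions):

```python
import numpy as np

# Best rank-r approximation of a data matrix (e.g., a face image stored as a matrix)
# via the truncated SVD; by the Eckart-Young theorem this is optimal in the 2- and
# Frobenius norms.
rng = np.random.default_rng(1)
X = rng.standard_normal((112, 92))   # assumed image-sized matrix
r = 20                               # assumed target rank

U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
rel_err = np.linalg.norm(X - X_r) / np.linalg.norm(X)
```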
Rank Modifications of Semi-definite Matrices with Applications to Secant Updates
The BFGS and DFP updates are perhaps the most successful Hessian and inverse Hessian approximations respectively for unconstrained minimization problems. This paper describes these methods in terms of two successive steps: rank reduction and rank restoration. From rank subtractivity and a powerful spectral result, the first step must necessarily result in a positive semidefinite matrix; and the...
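The textbook BFGS update referred to above can indeed be written as the two steps described: a rank-one subtraction followed by a rank-one addition. A minimal sketch with assumed test data (not code from that paper):

```python
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of a Hessian approximation B, written as two steps:
    a rank-one subtraction (leaves a positive semidefinite matrix) followed by a
    rank-one addition (restores positive definiteness when y @ s > 0)."""
    Bs = B @ s
    B_reduced = B - np.outer(Bs, Bs) / (s @ Bs)
    return B_reduced + np.outer(y, y) / (y @ s)

# Assumed test data
rng = np.random.default_rng(2)
n = 5
B = np.eye(n)
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)   # keeps the curvature condition y @ s > 0
B_new = bfgs_update(B, s, y)
```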
Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data
Several important applications, such as streaming PCA and semidefinite programming, involve a large-scale positive-semidefinite (psd) matrix that is presented as a sequence of linear updates. Because of storage limitations, it may only be possible to retain a sketch of the psd matrix. This paper develops a new algorithm for fixed-rank psd approximation from a sketch. The approach combines the N...
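Below is a minimal sketch of a Nyström-type fixed-rank psd approximation computed from a single linear sketch Y = AΩ; the test matrix, sketch size, and target rank are assumptions, and the paper's actual algorithm includes further numerical safeguards not shown here.

```python
import numpy as np

rng = np.random.default_rng(3)
n, r, sketch_size = 400, 10, 25      # assumed sizes: target rank r < sketch dimension

# Assumed psd test matrix with rapidly decaying spectrum
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(2.0 ** -np.arange(n, dtype=float)) @ Q.T

Omega = rng.standard_normal((n, sketch_size))   # random test matrix
Y = A @ Omega                                   # the only access to A: one linear sketch

# Nystrom approximation A ~= Y (Omega^T Y)^+ Y^T, truncated to rank r
B = Y @ np.linalg.pinv(Omega.T @ Y) @ Y.T
evals, evecs = np.linalg.eigh((B + B.T) / 2)    # symmetrize for numerical safety
top = np.argsort(evals)[::-1][:r]
A_r = evecs[:, top] @ np.diag(np.maximum(evals[top], 0.0)) @ evecs[:, top].T
rel_err = np.linalg.norm(A - A_r) / np.linalg.norm(A)
```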
Applying Schur Complements for Handling General Updates of a Large, Sparse, Unsymmetric Matrix
We describe a set of procedures for computing and updating an inverse representation of a large and sparse unsymmetric matrix A. The representation is built of two matrices: an easily invertible, large and sparse matrix A0 and a dense Schur complement matrix S. An efficient heuristic is given that finds this representation for any matrix A and keeps the size of S as small as possible. Equations wit...
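The Schur-complement mechanism behind such an inverse representation can be sketched in the rank-k case: solves with the updated matrix reduce to solves with the easily invertible part A0 plus a small dense Schur complement S. A generic Sherman–Morrison–Woodbury-style illustration with assumed data (not the paper's procedures):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(4)
n, k = 1000, 3

# Assumed data: easily invertible sparse A0 and a rank-k update A = A0 + U V^T
A0 = (10.0 * sp.eye(n) + sp.random(n, n, density=5.0 / n, random_state=4)).tocsc()
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))
b = rng.standard_normal(n)

lu = spla.splu(A0)                   # factor the easy part once
A0inv_U = lu.solve(U)
S = np.eye(k) + V.T @ A0inv_U        # small dense Schur complement

# Solve (A0 + U V^T) x = b using only A0-solves and the k-by-k matrix S
y = lu.solve(b)
x = y - A0inv_U @ np.linalg.solve(S, V.T @ y)

residual = np.linalg.norm((A0 @ x + U @ (V.T @ x)) - b) / np.linalg.norm(b)
```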
Journal: CoRR
Volume: abs/1707.03045
Pages: –
Publication year: 2017